# Modular Design

## Pal B Large Opt 350m

A personalized reward model for diverse alignment, trained on top of facebook/opt-350m for text summarization tasks.

- **License:** MIT
- **Author:** daiweichen
- **Tags:** Text Generation, Transformers, English
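Reward models of this kind are typically trained with a pairwise preference objective: the model assigns a scalar reward to each candidate summary, and the loss pushes the preferred summary's reward above the rejected one's. The exact recipe used for Pal B is not given here; the following is only a minimal sketch of the standard Bradley-Terry pairwise loss:

```python
import math

def preference_loss(r_chosen: float, r_rejected: float) -> float:
    """Bradley-Terry pairwise loss: -log sigmoid(r_chosen - r_rejected).

    r_chosen / r_rejected are the scalar rewards the model assigns to the
    preferred and the rejected summary, respectively.
    """
    margin = r_chosen - r_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

A larger reward margin in favor of the preferred summary yields a smaller loss, so training widens the gap between good and bad summaries.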
## MoLM 700M-4B

MoLM is a series of language models built on a Mixture of Experts (MoE) architecture. The 700M-4B variant has 4 billion parameters in total, with per-token computational cost roughly equivalent to a 700-million-parameter dense model.

- **License:** Apache-2.0
- **Author:** ibm-research
- **Tags:** Large Language Model, Transformers
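The "4B total parameters, 700M-dense compute" trade-off comes from sparse routing: each token is processed by only the top-k experts a gating function selects, so the remaining experts contribute parameters but no compute for that token. This is an illustrative sketch of top-k routing, not MoLM's actual implementation:

```python
def top_k_experts(gate_scores, k=2):
    """Indices of the k highest-scoring experts for one token."""
    return sorted(range(len(gate_scores)), key=lambda i: gate_scores[i], reverse=True)[:k]

def moe_forward(x, experts, gate_scores, k=2):
    """Run the input through only the top-k experts, weighted by their
    (renormalized) gate scores; the unselected experts cost nothing."""
    chosen = top_k_experts(gate_scores, k)
    total = sum(gate_scores[i] for i in chosen)
    return sum((gate_scores[i] / total) * experts[i](x) for i in chosen)

# Hypothetical toy experts: each just scales its input.
experts = [lambda x, w=w: w * x for w in (1.0, 2.0, 3.0, 4.0)]
```

With 4 experts and k=2, only half the expert parameters are exercised per token; scaling the expert count grows total capacity while per-token cost stays fixed.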
## Kecerdasan Buatan

Kecerdasan Buatan (Indonesian for "Artificial Intelligence") is an open-source AI project released under the GPL-3.0 license; its specific functionality depends on the model type.

- **License:** GPL-3.0
- **Author:** Yuuki0
- **Tags:** Large Language Model, Transformers
## ResNet26

ResNet26 is an image classification model built on the deep residual learning architecture, a smaller variant in the ResNet series.

- **License:** Apache-2.0
- **Author:** glasses
- **Tags:** Image Classification, Transformers
## ResNet152

ResNet152 is an image classification model based on deep residual learning; its residual connections mitigate the vanishing-gradient problem that hampers training of very deep networks.

- **License:** Apache-2.0
- **Author:** glasses
- **Tags:** Image Classification, Transformers
## ResNet34

ResNet34 is a convolutional neural network architecture based on deep residual learning, designed for image classification tasks.

- **License:** Apache-2.0
- **Author:** glasses
- **Tags:** Image Classification, Transformers
## ResNet50

ResNet50 is an image classification model based on deep residual learning; like the other ResNet variants, it uses residual connections to mitigate the vanishing-gradient problem in deep neural networks.

- **License:** Apache-2.0
- **Author:** glasses
- **Tags:** Image Classification, Transformers
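The residual connections the ResNet descriptions refer to reduce to a simple identity shortcut, y = x + F(x): each block learns only a residual F(x) on top of its input, so gradients can flow through the shortcut even when F is deep. A minimal numeric sketch (not the actual convolutional block):

```python
def residual_block(x, transform):
    """Identity shortcut: output is x + F(x), where F is the learned residual.

    x: list of floats standing in for a feature vector;
    transform: the block's learned function F (hypothetical here).
    """
    return [xi + fi for xi, fi in zip(x, transform(x))]
```

When the learned residual is near zero the block behaves as an identity, which is why stacking many such blocks (26, 34, 50, or 152 layers) does not degrade training the way plain stacked layers do.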